Iterative improvement of a nearest neighbor classifier

Authors

  • Hung-Chun Yau
  • Michael T. Manry
Abstract

In practical pattern recognition applications, the nearest neighbor classifier (NNC) is often applied because it does not require a priori knowledge of the joint probability density of the input feature vectors. As the number of example vectors is increased, the error probability of the NNC approaches that of the Bayesian classifier. However, at the same time, the computational complexity of the NNC increases. Also, for a small number of example vectors, the NNC is not optimal with respect to the training data. In this paper, we attack these problems by mapping the NNC to a sigma-pi neural network, to which it is partially isomorphic. A modified form of back-propagation (BP) learning is then developed and used to improve classifier performance. As examples, we apply our approach to the problems of hand-printed numeral recognition and geometrical shape recognition. Significant improvements in classification error percentages are observed for both the training data and the testing data. Send reprint requests to Prof. Michael T. Manry, Department of Electrical Engineering, University of Texas at Arlington, Arlington, Texas 76019. Phone: 817-273-3483.
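For readers unfamiliar with the baseline classifier being improved, the following is a minimal Python sketch of a plain 1-nearest-neighbor classifier. It does not reproduce the paper's sigma-pi network mapping or the modified back-propagation step; all function and variable names are illustrative.

  # Minimal baseline 1-nearest-neighbor classifier (illustrative only; the
  # sigma-pi mapping and modified BP training described above are not shown).
  import numpy as np

  def nnc_predict(train_x, train_y, query_x):
      """Assign each query vector the label of its closest training vector."""
      # Squared Euclidean distance between every query and every training vector.
      dists = ((query_x[:, None, :] - train_x[None, :, :]) ** 2).sum(axis=2)
      return train_y[np.argmin(dists, axis=1)]

  # Example: two classes of 2-D feature vectors.
  train_x = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
  train_y = np.array([0, 0, 1, 1])
  print(nnc_predict(train_x, train_y, np.array([[0.1, 0.1], [0.95, 1.0]])))  # -> [0 1]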


Similar articles

Comparing pixel-based and object-based algorithms for classifying land use of arid basins (Case study: Mokhtaran Basin, Iran)

In this research, two techniques, pixel-based and object-based image analysis, were investigated and compared for producing a land use map of the arid Mokhtaran basin, Birjand. Using Landsat satellite imagery from 2015, land use classification was performed with three object-based algorithms: supervised fuzzy-maximum likelihood, maximum likelihood, and K-nearest neighbor. Nine combinations...


Stabilized Nearest Neighbor Classifier and Its Statistical Properties

Stability has been of great concern in statistics: similar statistical conclusions should be drawn from different data sampled from the same population. In this article, we introduce a general measure of classification instability (CIS) to capture the sampling variability of the predictions made by a classification procedure. The minimax rate of CIS is established for general plug-in clas...
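A rough way to illustrate the kind of instability described above, shown here as an assumption rather than the article's exact CIS definition, is to fit the same procedure on two independent samples and record how often their predictions disagree on fresh query points.

  # Illustrative estimate of prediction instability (not the article's formal
  # CIS): fit a 1-nearest-neighbor rule on two independent samples and measure
  # the fraction of query points on which the two fitted rules disagree.
  import numpy as np

  def one_nn_predict(train_x, train_y, queries):
      d = ((queries[:, None, :] - train_x[None, :, :]) ** 2).sum(axis=2)
      return train_y[np.argmin(d, axis=1)]

  def estimated_instability(sample1, labels1, sample2, labels2, queries):
      pred1 = one_nn_predict(sample1, labels1, queries)
      pred2 = one_nn_predict(sample2, labels2, queries)
      return float(np.mean(pred1 != pred2))   # fraction of disagreeing predictions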


A Modified K-Nearest Neighbor Classifier to Deal with Unbalanced Classes

We present in this paper a simple, yet valuable improvement to the traditional k-Nearest Neighbor (kNN) classifier. It aims at addressing the issue of unbalanced classes by maximizing the class-wise classification accuracy. The proposed classifier also gives the option of favoring a particular class through evaluating a small set of fuzzy rules. When tested on a number of UCI datasets, the prop...
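The abstract does not give the fuzzy-rule scheme itself, but one simple, commonly used way to keep frequent classes from drowning out rare ones in a kNN vote, sketched here purely as an illustration, is to weight each neighbor's vote by the inverse size of its class.

  # Illustrative class-frequency-weighted kNN vote (an assumption, not the
  # fuzzy-rule method of the paper above): rare classes get larger vote weights.
  import numpy as np
  from collections import Counter

  def weighted_knn_predict(train_x, train_y, query, k=3):
      class_sizes = Counter(train_y.tolist())        # number of examples per class
      dists = np.linalg.norm(train_x - query, axis=1)
      neighbors = train_y[np.argsort(dists)[:k]]     # labels of the k closest points
      votes = Counter()
      for label in neighbors.tolist():
          votes[label] += 1.0 / class_sizes[label]   # inverse-frequency weighting
      return max(votes, key=votes.get)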


Optimal Iterative Discriminant Analysis In Kernel Space

The kernel trick is a powerful tool for solving complex pattern classification problems. As long as a linear feature extraction algorithm can be expressed exclusively in terms of dot products, it can be extended to a non-linear version by combining it with a kernel method. In this paper, we present such an improved iterative algorithm for linear discriminant analysis. By mapping data onto high dimensiona...
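The dot-product substitution the abstract refers to can be seen in a few lines: anywhere an algorithm only consumes the Gram matrix of pairwise dot products, that matrix can be replaced by kernel evaluations. The RBF kernel and gamma value below are illustrative choices, not ones taken from the paper.

  # Sketch of the kernel trick: swap the linear Gram matrix of dot products
  # for a kernel matrix, leaving the rest of the algorithm unchanged.
  import numpy as np

  def linear_gram(X):
      return X @ X.T                               # ordinary dot products x_i . x_j

  def rbf_gram(X, gamma=0.5):
      sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
      return np.exp(-gamma * sq)                   # dot products in an implicit feature space

  X = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
  print(linear_gram(X))
  print(rbf_gram(X))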


Hilbert Space Filling Curve (HSFC) Nearest Neighbor Classifier

The Nearest Neighbor algorithm is one of the simplest and oldest classification techniques. A given collection of historical data (Training Data) of known classification is stored in memory. Then, based on the stored knowledge, the classification of an unknown data point (Test Data) is predicted by finding the classification of its nearest neighbor. For example, if an instance from the test set is presen...
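To make the idea of curve-based neighbor search concrete, the sketch below maps 2-D points to 1-D keys, sorts by key, and probes only points with nearby keys. A Morton (Z-order) key is used as a simple stand-in for the Hilbert curve, so this is an approximation for illustration, not the paper's algorithm, and the returned neighbor may not be the exact nearest one.

  # Curve-ordered approximate nearest-neighbor search (illustrative sketch;
  # a Morton/Z-order key stands in for the Hilbert space filling curve).
  import numpy as np

  def morton_key(x, y, bits=16):
      """Interleave the bits of non-negative integer coordinates x and y."""
      key = 0
      for i in range(bits):
          key |= ((x >> i) & 1) << (2 * i)
          key |= ((y >> i) & 1) << (2 * i + 1)
      return key

  def curve_nn(train_xy, train_labels, query_xy, window=3):
      keys = np.array([morton_key(x, y) for x, y in train_xy])
      order = np.argsort(keys)
      pos = np.searchsorted(keys[order], morton_key(*query_xy))
      lo, hi = max(0, pos - window), min(len(order), pos + window)
      candidates = order[lo:hi]                    # points whose keys are close on the curve
      d = ((train_xy[candidates] - np.asarray(query_xy)) ** 2).sum(axis=1)
      return train_labels[candidates[np.argmin(d)]]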



Journal:
  • Neural Networks

Volume 4, Issue 

Pages  -

Publication year: 1991